Generalized Maximum Entropy
Authors
Abstract
A long-standing mystery in using Maximum Entropy (MaxEnt) is how to deal with constraints whose values are uncertain. This situation arises when constraint values are estimated from data, because of finite sample sizes. One approach to this problem, advocated by E. T. Jaynes [1], is to ignore this uncertainty and treat the empirically observed values as exact. We refer to this as the classic MaxEnt approach. Classic MaxEnt gives point probabilities (subject to the given constraints), rather than probability densities. We develop an alternative approach that assumes that the uncertain constraint values are represented by a probability density (e.g., a Gaussian), and this uncertainty yields a MaxEnt posterior probability density. That is, the classic MaxEnt point probabilities are regarded as a multidimensional function of the given constraint values, and uncertainty on these values is transmitted through the MaxEnt function to give uncertainty over the MaxEnt probabilities. We illustrate this approach by explicitly calculating the generalized MaxEnt density for a simple but common case, then show how this can be extended numerically to the general case. This paper expands the generalized MaxEnt concept introduced in a previous paper [3].
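To make the distinction concrete, the following is a minimal numerical sketch of the idea, not the paper's own calculation: classic MaxEnt is solved for a die with a single mean constraint, and Gaussian uncertainty on that mean is pushed through the MaxEnt solution by Monte Carlo sampling, giving an empirical picture of the resulting density over the MaxEnt probabilities. The sample space, the constraint value 4.5, the noise level, and all names are illustrative assumptions.

```python
# Sketch: propagating constraint uncertainty through classic MaxEnt.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

x = np.arange(1, 7)           # sample space: faces of a die
f = x.astype(float)           # one constraint function f(x) = x (the mean)


def classic_maxent(F):
    """Classic MaxEnt: maximize entropy subject to E_p[f] = F.

    The solution has exponential form p_i proportional to exp(lam * f_i);
    lam is found by minimizing the convex dual psi(lam) = log Z(lam) - lam * F.
    """
    def dual(lam):
        logZ = np.log(np.sum(np.exp(lam[0] * f)))
        return logZ - lam[0] * F

    res = minimize(dual, x0=[0.0], method="BFGS")
    p = np.exp(res.x[0] * f)
    return p / p.sum()


# Classic MaxEnt treats the observed constraint value as exact:
p_classic = classic_maxent(4.5)

# Generalized MaxEnt: the constraint value is uncertain, say F ~ N(4.5, 0.1^2),
# e.g. because it is a sample mean from finitely many observations.  Sampling F
# and re-solving maps that uncertainty into uncertainty over the probabilities.
F_samples = rng.normal(loc=4.5, scale=0.1, size=200)
p_samples = np.array([classic_maxent(F) for F in F_samples])

print("classic MaxEnt probabilities:   ", np.round(p_classic, 4))
print("posterior mean of probabilities:", np.round(p_samples.mean(axis=0), 4))
print("posterior std of probabilities: ", np.round(p_samples.std(axis=0), 4))
```

The spread of `p_samples` is a Monte Carlo stand-in for the generalized MaxEnt density described above; the analytical treatment in the paper would replace this sampling step for the simple case it works out explicitly.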
Similar Articles
A Note on the Bivariate Maximum Entropy Modeling
Let X = (X1, X2) be a continuous random vector. Under the assumption that the marginal distributions of X1 and X2 are given, we develop models for the vector X when there is partial information about the dependence structure between X1 and X2. The models, which are obtained based on the well-known Principle of Maximum Entropy, are called the maximum entropy (ME) mo...
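For orientation, a generic instance of this kind of bivariate ME problem (the moment constraint below is an illustrative assumption, not necessarily the dependence information used in the paper) is

\[
\max_{f}\; -\iint f(x_1,x_2)\,\ln f(x_1,x_2)\,dx_1\,dx_2
\quad\text{subject to}\quad
\int f\,dx_2 = f_1(x_1),\;\;
\int f\,dx_1 = f_2(x_2),\;\;
\mathbb{E}[X_1 X_2] = c .
\]

With no dependence constraint the maximizer is the independent joint density \(f_1(x_1)f_2(x_2)\); with Gaussian marginals and a covariance constraint it is the bivariate normal.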
A mathematical review of the generalized entropies and their matrix trace inequalities
We review the properties of the generalized entropies in our previous papers in the following way. (1) A generalized Fannes’ inequality is shown by the axiomatically characterized Tsallis entropy. (2) The maximum entropy principles in nonextensive statistical physics are revisited as an application of the Tsallis relative entropy defined for the nonnegative matrices in the framework of matrix ana...
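For reference, the discrete Tsallis entropy and Tsallis relative entropy referred to here are, in one common convention,

\[
S_q(p) = \frac{1-\sum_i p_i^{\,q}}{q-1},
\qquad
D_q(p\,\|\,r) = \frac{1-\sum_i p_i^{\,q}\, r_i^{\,1-q}}{1-q},
\]

which recover the Shannon entropy \(-\sum_i p_i \ln p_i\) and the Kullback–Leibler divergence \(\sum_i p_i \ln(p_i/r_i)\), respectively, as \(q \to 1\). Roughly speaking, the matrix-trace versions studied in the paper replace these sums over probability vectors with traces over nonnegative matrices.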
Tsallis distribution as a standard maximum entropy solution with ‘tail’ constraint
We show that Tsallis’ distributions can be derived from the standard (Shannon) maximum entropy setting, by incorporating a constraint on the divergence between the distribution and another distribution imagined as its tail. In this setting, we find an underlying entropy which is the Rényi entropy. Furthermore, escort distributions and generalized means appear as a direct consequence of the cons...
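As background (the tail-constraint construction itself is the paper's contribution and is not reproduced here), the Rényi entropy and the Tsallis (q-exponential) distribution mentioned above are

\[
H_\alpha(p) = \frac{1}{1-\alpha}\,\ln\sum_i p_i^{\alpha},
\qquad
p_q(x) \;\propto\; \bigl[\,1-(1-q)\,\beta x\,\bigr]_+^{\,1/(1-q)},
\]

with \(H_\alpha(p) \to -\sum_i p_i \ln p_i\) and \(p_q(x) \to e^{-\beta x}\) as \(\alpha, q \to 1\), i.e. the standard Shannon MaxEnt exponential family is recovered in the limit.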
Generalized Entropy Concentration for Counts
We consider the phenomenon of entropy concentration under linear constraints in a discrete setting, using the “balls and bins” paradigm, but without the assumption that the number of balls allocated to the bins is known. Therefore instead of frequency vectors and ordinary entropy, we have count vectors with unknown sum, and a certain generalized entropy. We show that if the constraints bound th...
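For context, the classical fixed-N form of entropy concentration (the paper's point is precisely to drop the known-sum assumption) runs as follows: the number of ways to allocate \(N\) balls so that bin \(i\) receives \(\nu_i\) of them is the multinomial coefficient, whose normalized logarithm tends to the entropy of the frequencies,

\[
W(\nu) = \frac{N!}{\nu_1!\cdots\nu_m!},
\qquad
\frac{1}{N}\ln W(\nu) \;\longrightarrow\; -\sum_{i=1}^{m} \frac{\nu_i}{N}\,\ln\frac{\nu_i}{N}
\quad (N \to \infty),
\]

so among all allocations satisfying the linear constraints, the overwhelming majority have frequencies close to the constrained entropy maximizer.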
On Measure Theoretic definitions of Generalized Information Measures and Maximum Entropy Prescriptions
Although the entropy functional \(-\int_X \frac{dP}{d\mu}\ln\frac{dP}{d\mu}\,d\mu\) on a measure space \((X, \mathcal{M}, \mu)\) does not qualify itself as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the measure-theoretic case are consistent with those of the discrete case. In this paper, we study the measure-theoretic definitions of generalized information measures and discuss the ME prescriptions. We prese...
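A standard example of a measure-theoretic information measure that is a natural extension of the discrete case, and whose minimization underlies the usual ME prescriptions, is the relative entropy of \(P\) with respect to \(R\) (for \(P \ll R\)),

\[
D(P\,\|\,R) = \int_X \ln\frac{dP}{dR}\,dP
            = \int_X \frac{dP}{d\mu}\,\ln\frac{dP/d\mu}{dR/d\mu}\,d\mu ,
\]

which reduces to \(\sum_i p_i \ln(p_i/r_i)\) under the counting measure.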
A Generalized Iterative Scaling Algorithm for Maximum Entropy Reasoning in Relational Probabilistic Conditional Logic Under Aggregation Semantics
Recently, different semantics for relational probabilistic conditionals and corresponding maximum entropy (ME) inference operators have been proposed. In this paper, we study the so-called aggregation semantics that covers both notions of a statistical and subjective view. The computation of its inference operator requires the calculation of the ME-distribution satisfying all probabilistic cond...
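For readers unfamiliar with the algorithm family named in the title, here is a minimal sketch of plain Generalized Iterative Scaling (Darroch–Ratcliff) for an ordinary discrete MaxEnt problem. The feature matrix, targets, and helper names are illustrative assumptions; none of the relational-logic or aggregation-semantics machinery of the paper is reproduced.

```python
# Sketch: Generalized Iterative Scaling (GIS) for a small discrete MaxEnt fit.
import numpy as np

def gis(feat, targets, iters=500):
    """Fit p(x) proportional to exp(sum_k lam_k * feat[x, k]) so that
    E_p[feat[:, k]] = targets[k].

    GIS requires non-negative features whose row sums equal a constant C;
    we enforce this by appending a slack feature with target C - sum(targets).
    """
    feat = np.asarray(feat, dtype=float)
    C = feat.sum(axis=1).max()
    slack = C - feat.sum(axis=1)                     # makes every row sum to C
    feat = np.column_stack([feat, slack])
    targets = np.append(np.asarray(targets, dtype=float), C - np.sum(targets))
    lam = np.zeros(feat.shape[1])
    for _ in range(iters):
        p = np.exp(feat @ lam)
        p /= p.sum()                                 # current model distribution
        expect = p @ feat                            # current feature expectations
        lam += np.log(targets / expect) / C          # GIS multiplicative update
    return p, lam

# Toy usage: 4 outcomes, 2 binary features, desired expectations 0.7 and 0.4.
feat = np.array([[1.0, 0.0],
                 [1.0, 1.0],
                 [0.0, 1.0],
                 [0.0, 0.0]])
p, lam = gis(feat, targets=[0.7, 0.4])
print("fitted distribution: ", np.round(p, 4))
print("feature expectations:", np.round(p @ feat, 4))   # close to [0.7, 0.4]
```

The ME-distribution in the paper plays the same role as `p` here, but is defined over relational possible worlds and constrained by probabilistic conditionals under aggregation semantics rather than by this toy feature matrix.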